2. Model Code and Configuration Description

This chapter presents the technical description of the WRF-Hydro model code. The chapter is divided into the following sections:

2.1 Brief code overview

WRF-Hydro is written in a modularized Fortran 90 coding structure whose routing physics modules are switch-activated through a model namelist file (hydro.namelist). The code has been parallelized for execution on high-performance, parallel computing architectures, including Linux commodity clusters and multi-processor desktops as well as multiple supercomputers. More detailed model requirements depend on the choice of model driver, described in the next section.
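For orientation, the routing switches live in the &HYDRO_nlist block of hydro.namelist. The fragment below is an illustrative sketch using switch names from the V5 public release; confirm the exact names and defaults against the template namelist shipped in the template/ directory:

```
&HYDRO_nlist
  ! Switch-activated routing physics (0 = off, 1 = on)
  SUBRTSWCRT  = 1    ! subsurface routing
  OVRTSWCRT   = 1    ! overland (surface) routing
  CHANRTSWCRT = 1    ! channel routing
  GWBASESWCRT = 1    ! conceptual baseflow (bucket) model
/
```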

2.2 Driver level description

WRF-Hydro is essentially a group of modules and functions which handle the communication of information between atmospheric components (such as WRF, CESM, or prescribed meteorological analyses) and sets of land surface hydrology components. From a coding perspective, the WRF-Hydro system can be called from an existing architecture such as the WRF model, the CESM, NASA LIS, etc., or it can run in a standalone mode with its own driver, which has adapted part of the NCAR High Resolution Land Data Assimilation System (HRLDAS). Each new coupling effort requires some basic modifications to a general set of functions to manage the coupling. In WRF-Hydro, each new system that WRF-Hydro is coupled into is assigned a directory named for the coupling component. For instance, the code which handles the coupling to the WRF model is contained in the WRF_cpl/ directory of the WRF-Hydro system. Similarly, the code which handles the coupling to the offline Noah land surface modeling system is contained within the Noah_cpl/ directory, and so on. A description of each directory is provided in Section 2.4.

The coupling structure is illustrated here, briefly, in terms of the coupling of WRF-Hydro into the WRF model. A similar approach is used for coupling the WRF-Hydro extension package into other modeling systems or for coupling other modeling systems into WRF-Hydro.

Example: For coupled WRF/WRF-Hydro runs, the WRF-Hydro components are compiled as a single library function call with the WRF system. As such, a single executable is created upon compilation (wrf.exe). As illustrated in Figure 2.1, WRF-Hydro is called directly from WRF in the WRF surface driver module (phys/module_surface_driver.F). The code that manages the communication is the WRF_drv_Hydro.F interface module contained within the WRF_cpl/ directory. The WRF_drv_Hydro.F interface module is the specific instance of a 'General WRF-Hydro Coupling Interface' for the WRF model, which passes data, grid, and time information between WRF and WRF-Hydro. Components within WRF-Hydro then manage the dynamic regridding ('data mapping') and sub-component routing functions (e.g. surface, subsurface, and/or channel routing) within WRF-Hydro (see Fig. 1.1 for an illustration of components contained within WRF-Hydro).

Upon completion of the user-specified routing functions, WRF-Hydro will remap the data back to the WRF model grid and then pass the necessary variables back to the WRF model through the WRF_drv_Hydro.F interface module. Therefore, the key component of the WRF-Hydro system is the proper construction of the WRF_cpl_Hydro interface module (or more generally ‘XXX_cpl_Hydro’). Users wishing to couple new modules to WRF-Hydro will need to create a unique “General WRF-Hydro Coupling Interface” for their components. Some additional examples of this interface module are available upon request for users to build new coupling components. This simple coupling interface is similar in structure to other general model coupling interfaces such as those within the Earth System Modeling Framework (ESMF) or the Community Surface Dynamics Modeling System (CSDMS).
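In pseudocode terms, every such interface performs the same three duties: pass state data in, let WRF-Hydro regrid and route, and pass updated states back. The Python sketch below illustrates that contract only; all class and function names are hypothetical (the real interfaces are Fortran modules such as WRF_drv_Hydro.F), and the nearest-neighbour regridding is a stand-in for the model's actual data mapping:

```python
# Conceptual sketch of a "General WRF-Hydro Coupling Interface".
# All names here are hypothetical illustrations, not the real API.

class HydroCouplingInterface:
    """Mediates data, grid, and time information between a parent
    model (e.g. WRF) and the WRF-Hydro routing components."""

    def __init__(self, parent_cells, routing_cells):
        self.parent_cells = parent_cells    # coarse land model grid size
        self.routing_cells = routing_cells  # fine terrain routing grid size

    def couple_step(self, parent_states, route):
        # 1. Regrid ("data mapping") parent states to the routing grid.
        fine = self.regrid(parent_states, self.parent_cells, self.routing_cells)
        # 2. Execute the switch-activated routing functions.
        fine = route(fine)
        # 3. Remap updated states back to the parent model grid.
        return self.regrid(fine, self.routing_cells, self.parent_cells)

    @staticmethod
    def regrid(values, src_cells, dst_cells):
        # Placeholder nearest-neighbour mapping by cell-index ratio.
        ratio = src_cells / dst_cells
        return [values[min(int(i * ratio), src_cells - 1)]
                for i in range(dst_cells)]
```

A usage sketch: `HydroCouplingInterface(2, 4).couple_step([1.0, 3.0], route=lambda f: [v * 0.5 for v in f])` maps two coarse cells onto four routing cells, applies a toy routing step, and averages back to the coarse grid.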

Figure 2.1 Schematic illustrating the coupling and calling structure of WRF-Hydro from the WRF Model.

The model code has been compiled using the PGI Fortran compiler, the Intel ‘ifort’ compiler, and the freely-available GNU Fortran compiler ‘gfortran’ for use with Unix-type operating systems on desktops, clusters, and supercomputing systems. Because the WRF-Hydro modeling system relies on netCDF input and output file conventions, netCDF Fortran libraries must be installed and properly compiled on the system upon which WRF-Hydro is to be executed. Not doing so will result in numerous error messages such as ‘…undefined reference to netCDF library…’ or similar messages upon compilation. For further installation requirements see the FAQs page of the website as well as the How To Build & Run WRF-Hydro V5 in Standalone Mode document, also available from https://ral.ucar.edu/projects/wrf_hydro.

2.3 Parallelization strategy

Parallelization of the WRF-Hydro code utilizes geographic domain decomposition and ‘halo’ array passing structures similar to those used in the WRF atmospheric model (Figures 2.2 and 2.3). Message passing between processors is accomplished using MPI protocols. Therefore, the relevant MPI libraries must be installed and properly compiled on the system upon which WRF-Hydro is to be executed in parallel mode. A sequential compile is not currently supported, so MPI libraries are required even when running on a single core.

Figure 2.2 Schematic of parallel domain decomposition scheme in WRF-Hydro. Boundary or ‘halo’ arrays in which memory is shared between processors (P1 and P2) are shaded in purple.

Figure 2.3 Schematic of parallel domain decomposition scheme in WRF-Hydro as applied to channel routing. Channel elements (stars) are communicated at boundaries via ‘halo’ arrays in which memory is shared between processors (P1 and P2). Black and red stars indicate overlapping channel elements used in the diffusive wave solver.
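The halo idea can be sketched without MPI at all: each subdomain carries an extra boundary cell that mirrors its neighbour's edge value, and that cell is refreshed before any stencil computation that reaches across the boundary. The following toy Python example, a sketch under the assumption of a 1-D domain split across two processors (P1, P2), replaces the MPI message passing with a plain copy:

```python
# Toy illustration of halo exchange between two subdomains (P1, P2).
# Real WRF-Hydro uses MPI message passing on 2-D decompositions; here
# the "communication" is a plain Python copy between lists.

def exchange_halos(p1, p2):
    """Fill the one-cell halo of each subdomain with the neighbour's
    edge value (p1[-1] is P1's halo, p2[0] is P2's halo)."""
    p1[-1] = p2[1]   # first interior cell of P2 -> halo of P1
    p2[0] = p1[-2]   # last interior cell of P1 -> halo of P2
    return p1, p2

def smooth_interior(sub):
    """3-point average over interior cells, using halo values at the ends,
    standing in for any stencil that reaches across the boundary."""
    return [sub[0]] + [(sub[i - 1] + sub[i] + sub[i + 1]) / 3.0
                       for i in range(1, len(sub) - 1)] + [sub[-1]]

# Global domain [0,1,2,3,4,5] split as P1 = interior + halo slot,
# P2 = halo slot + interior.
p1 = [0.0, 1.0, 2.0, None]   # last slot is P1's halo
p2 = [None, 3.0, 4.0, 5.0]   # first slot is P2's halo
p1, p2 = exchange_halos(p1, p2)
```

After the exchange, each processor can apply its stencil to its own interior cells without further communication, which is exactly what the halo pattern buys in the MPI case.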

2.4 Directory Structures

The top-level directory structure of the code, as nested under trunk/NDHMS, is provided below, and subdirectory structures are described thereafter. The tables below provide brief descriptions of the file contents of each directory where the model code resides.

Table 2.1 Description of the file contents of each directory where the model code resides.

File/directory name

Description

Main code files and directories (under version control in a GitHub repository):

Directories:

arc/

Contains macro files, which specify the compile configurations, compiler options, links to netCDF libraries, etc.

CPL/Noah_cpl/

Contains the WRF-Hydro coupling interface for coupling WRF-Hydro components with the standalone (offline) Noah land surface model data assimilation and forecasting system

CPL/NoahMP_cpl/

Contains the WRF-Hydro coupling interface for coupling WRF-Hydro components with the standalone (offline) Noah-MP land surface model data assimilation and forecasting system

CPL/WRF_cpl/

Contains the WRF-Hydro coupling interface for coupling WRF-Hydro components with the WRF system

CPL/CLM_cpl/, CPL/LIS_cpl/, CPL/NUOPC_cpl/

Work in progress for ongoing coupling work. Not actively supported.

Data_Rec/

Contains some data declaration modules

Debug_Utilities/

Utilities for debugging

deprecated/

Contains files not currently used

Doc/

Pointer to location of full documentation (i.e. this document).

HYDRO_drv/

Contains the high-level WRF-Hydro component driver: module_HYDRO_drv.F

Land_models/Noah/

Contains the Noah land surface model driver for standalone or uncoupled applications

Land_models/NoahMP/

Contains the Noah-MP land surface model driver for standalone or uncoupled applications

MPP/

Contains MPI parallelization routines and functions

nudging/

Contains nudging data assimilation routines and functions

Rapid_routing/

Contains the files necessary for RAPID routing model coupling. Unsupported, as the included version of RAPID is out of date.

Routing/

Contains modules and drivers related to specific routing processes in WRF-Hydro

template/

Contains example namelist files for Noah, Noah-MP, and the WRF-Hydro modules (HYDRO). Default and example parameter tables are also included for HYDRO. Note: parameter tables for Noah and Noah-MP are stored within the Land_models directory. A sample bash script (setEnvar.sh) listing compile-time options for WRF-Hydro, which can be passed to the compile script, is also located here.


utils/

Contains internal model versioning utilities

Files:

compile_offline_Noah.sh

Script for compiling WRF-Hydro standalone (offline) version with the Noah land surface model

compile_offline_NoahMP.sh

Script for compiling WRF-Hydro standalone (offline) version with the Noah-MP land surface model

configure

Script to configure the WRF-Hydro compilation

Makefile

The top-level makefile for building and cleaning WRF-Hydro code

README.build.txt

WRF-Hydro build instructions for the standalone model

wrf_hydro_config

Configure script for coupled WRF | WRF-Hydro configuration

*.json

JSON files used for testing

Local files and directories created on configure/compile (not part of the version controlled repository):

Directories:

lib/

Directory where compiled libraries are written

mod/

Directory where compiled .mod files are written upon compilation

LandModel/

A symbolic link set by the configure script, which selects the land model from Land_models/ (e.g. LandModel -> Land_models/NoahMP when Noah-MP is selected)

LandModel_cpl/

A symbolic link set by the configure script, which selects the land model from CPL/. (e.g. LandModel_cpl -> CPL/NoahMP_cpl when NoahMP is selected)

Run/

Directory where model executable, example parameter tables, and example namelist files for the compiled model configuration will be populated. These files will be overwritten on compile. It is recommended the user copy the contents of this directory into an alternate location, separate from the code, to execute model runs.

Files:

macros

Macro definition file created by the ‘configure’ script that specifies compilation settings.

Table 2.2 Modules within the Routing/ directory which relate to routing processes in WRF-Hydro

File/directory name

Description

Overland/

Directory containing overland routing modules

Makefile

Makefile for WRF-Hydro component

module_channel_routing.F

Module containing WRF-Hydro channel routing components

module_date_utilities_rt.F

Module containing various date/time utilities for routing routines

module_GW_baseflow.F

Module containing model physics for simple baseflow model

module_HYDRO_io.F

Module containing WRF-Hydro input and (some) output functions

module_HYDRO_utils.F

Module containing several WRF-Hydro utilities

module_lsm_forcing.F

Module containing the options for reading in different forcing data types

module_noah_chan_param_init_rt.F

Module containing routines to initialize WRF-Hydro routing grids

module_NWM_io.F

Module containing routines to produce the desired CF-compliant output files

module_NWM_io_dict.F

Dictionary to support CF-compliant output routines.

module_RT.F

Module containing the calls to all of the WRF-Hydro routing initialization routines

module_UDMAP.F

Module for the user-defined mapping capabilities, currently used for NWM configuration (NHDPlus network)

Noah_distr_routing.F

Module containing overland flow and subsurface physics routines and grid disaggregation routine

module_gw_gw2d.F

Module containing routines for the experimental 2D groundwater model

2.5 Model Sequence of Operations

The basic structure and sequencing of WRF-Hydro are diagrammatically illustrated in Figure 2.4. High-level job management (e.g. time management, initialization, I/O and model completion) is handled by the WRF-Hydro system unless WRF-Hydro is coupled into, and beneath, a different modeling architecture. The WRF-Hydro system can either call an independent land model driver such as the NCAR High Resolution Land Data Assimilation System (HRLDAS) for both Noah and Noah-MP land surface models to execute column land surface physics or be called by a different modeling architecture such as WRF, the NCAR CESM, or the NASA LIS. When run in a standalone or “uncoupled” mode, WRF-Hydro must read in the meteorological forcing data necessary to perform land surface model calculations and it contains the necessary routines to do this. When run in a coupled mode with WRF or another larger architecture, WRF-Hydro receives meteorological forcing or land surface states and fluxes from the parent architecture. The basic execution process is as follows:

1. Upon initialization, static land surface physiographic data are read into the WRF-Hydro system and the model domain and computational arrays are established.

2. Depending on whether WRF-Hydro is run offline as a standalone system or is coupled into another architecture, either forcing data are read in or land surface states and fluxes are passed in.

3. For offline simulations which require land model execution, the gridded column land surface model is executed.

4. If routing is activated and there is a difference between the land model grid and the routing grid, land surface states and fluxes are disaggregated to the high-resolution terrain routing grids.

5. If activated, subsurface routing physics are executed.

6. If activated, surface routing physics are executed.

7. If activated, the conceptual baseflow model is executed.

8. If activated, channel and reservoir routing components are executed. Streamflow nudging is currently available to be applied within the Muskingum-Cunge routing call.

9. Updated land surface states and fluxes are then aggregated from the high-resolution terrain routing grid back to the land surface model grid (if routing is activated and there is a difference between the land model grid and the routing grid).

10. Results from these integrations are then written to the model output files and restart files or, in the case of a coupled WRF/WRF-Hydro simulation, passed back to the WRF model.
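The sequence above can be sketched as a single-timestep driver. In the Python sketch below the routing calls are placeholders that only record execution order, and the disaggregate/aggregate helpers illustrate the grid mapping with a simple replicate-and-average scheme; this is an illustrative assumption, not the model's actual regridding:

```python
# Sketch of the per-timestep sequence of operations described above.
# Flags mirror the namelist routing switches; physics bodies are omitted
# and only the order of operations is recorded.

REGRID = 4  # routing cells per land-model cell (illustrative factor)

def disaggregate(coarse):
    # Land grid -> high-resolution terrain routing grid (replicate values).
    return [v for v in coarse for _ in range(REGRID)]

def aggregate(fine):
    # Routing grid -> land grid (average each block of REGRID cells).
    return [sum(fine[i:i + REGRID]) / REGRID
            for i in range(0, len(fine), REGRID)]

def hydro_timestep(states, subsurface_on, surface_on, baseflow_on, channel_on):
    ops = []                     # record the order of operations
    fine = disaggregate(states)
    ops.append("disaggregate")
    if subsurface_on:
        ops.append("subsurface_routing")
    if surface_on:
        ops.append("surface_routing")
    if baseflow_on:
        ops.append("baseflow")
    if channel_on:
        ops.append("channel_reservoir_routing")
    states = aggregate(fine)     # map updated states back to the land grid
    ops.append("aggregate")
    return states, ops
```

Because the placeholder routing steps do not alter the fine-grid values, a round trip through disaggregate/aggregate returns the original coarse states, which makes the ordering easy to verify.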

As illustrated at the bottom of Figure 2.4, a data assimilation capability using NCAR’s DART (https://www.image.ucar.edu/DAReS/DART/) has been developed. This currently works only with WRF-Hydro in standalone mode; DART updates WRF-Hydro states independently of model time integration.

Figure 2.4 Modular calling structure of WRF-Hydro.

2.6 WRF-Hydro compile-time options

Compile-time options are choices about the model structure which are determined when the model is compiled. They select a WRF-Hydro instance from among the options illustrated in Figure 2.4 and fall into two categories: (1) the selected model driver, and (2) the compile options for that choice of driver. In this guide we limit the description of model drivers to WRF, Noah, and Noah-MP. Configuring, compiling, and running WRF-Hydro in standalone mode is described in detail in the How To Build & Run WRF-Hydro V5 in Standalone Mode document available from https://ral.ucar.edu/projects/wrf_hydro.

Compile-time options are listed in the repository in the file trunk/NDHMS/template/setEnvar.sh, and the information in that script is annotated here. As the path implies, this script is a template file; should you choose to edit it, please make a copy for your needs (not under version control). The compile script in trunk/NDHMS/compile*sh can accept the variables listed in this file if they are set in the user’s environment, or the user can optionally supply a file containing the variables as the first argument to the script. If make is used for rebuilding the code, it is recommended that the user put the variables into their environment (e.g. in bash, by sourcing the file containing the variables) before running the compile step. The template file lists each compile-time option with a brief description in accompanying comments.
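As an illustration, the compile-time variables in the V5-era setEnvar.sh take roughly the following form. Treat this as a sketch rather than the authoritative file: the variable list and defaults may differ between releases, so always consult trunk/NDHMS/template/setEnvar.sh in your own checkout.

```bash
#!/bin/bash
# Sketch of trunk/NDHMS/template/setEnvar.sh (V5-era variable names).
# 0 = off, 1 = on; descriptions are given as comments.

export WRF_HYDRO=1          # always 1 when building WRF-Hydro
export HYDRO_D=0            # enhanced diagnostic/debug output
export SPATIAL_SOIL=1       # spatially distributed soil parameters
export WRF_HYDRO_RAPID=0    # coupling with the RAPID routing model (unsupported)
export WRF_HYDRO_NUDGING=0  # streamflow nudging data assimilation
export NWM_META=0           # NWM-style output metadata
export NCEP_WCOSS=0         # build adjustments for the NCEP WCOSS machines
```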


